Multi-level Distance Regularization for Deep Metric Learning
Authors
Abstract
We propose a novel distance-based regularization method for deep metric learning called Multi-level Distance Regularization (MDR). MDR explicitly disturbs the learning procedure by regularizing pairwise distances between embedding vectors into multiple levels, each representing a degree of similarity between a pair. In the training stage, the model is trained with both MDR and an existing loss function of deep metric learning simultaneously; the two losses interfere with each other's objective, which makes the learning process more difficult. Moreover, MDR prevents some examples from being ignored or overly influential in the learning process. Together, these effects allow the parameters of the embedding network to settle on a local optimum with better generalization. Without bells and whistles, MDR with a simple Triplet loss achieves state-of-the-art performance on various benchmark datasets: CUB-200-2011, Cars-196, Stanford Online Products, and In-Shop Clothes Retrieval. We extensively perform ablation studies on its behaviors to show the effectiveness of MDR. By easily adopting our MDR, previous approaches can improve their generalization ability.
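The core idea in the abstract can be sketched in code: compute all pairwise embedding distances, normalize them, and penalize each distance's deviation from its nearest "level". This is a minimal NumPy sketch; the level values, the normalization scheme, and the use of fixed (non-learned) levels are assumptions of this illustration, not the paper's exact formulation.

```python
import numpy as np

def mdr_loss(embeddings, levels=(-1.0, 0.0, 1.0)):
    """Sketch of a multi-level distance regularizer.

    embeddings: (n, d) array of embedding vectors.
    levels: assumed set of level values (fixed here; the paper
            treats levels as part of the learned regularizer).
    """
    n = embeddings.shape[0]
    # Pairwise Euclidean distances, keeping each pair once (i < j).
    diffs = embeddings[:, None, :] - embeddings[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(-1))
    d = dists[np.triu_indices(n, k=1)]
    # Normalize to zero mean / unit std so the fixed levels are
    # scale-free (an assumption of this sketch).
    d_norm = (d - d.mean()) / (d.std() + 1e-8)
    # Pull each normalized distance toward its nearest level.
    lv = np.asarray(levels)
    nearest = lv[np.abs(d_norm[:, None] - lv[None, :]).argmin(axis=1)]
    return np.abs(d_norm - nearest).mean()

rng = np.random.default_rng(0)
loss = mdr_loss(rng.normal(size=(8, 16)))
```

In training, this regularizer would be added to an existing metric-learning loss (e.g. Triplet), so the two objectives are optimized jointly as the abstract describes.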
Similar resources
Multi-Modal Distance Metric Learning
Multi-modal data is dramatically increasing with the fast growth of social media. Learning a good distance measure for data with multiple modalities is of vital importance for many applications, including retrieval, clustering, classification and recommendation. In this paper, we propose an effective and scalable multi-modal distance metric learning framework. Based on the multi-wing harmonium ...
Deep metric learning for multi-labelled radiographs
Many radiological studies can reveal the presence of several co-existing abnormalities, each one represented by a distinct visual pattern. In this article we address the problem of learning a distance metric for plain radiographs that captures a notion of “radiological similarity”: two chest radiographs are considered to be similar if they share similar abnormalities. Deep convolutional neural ...
Deep Distance Metric Learning with Data Summarization
We present Deep Stochastic Neighbor Compression (DSNC), a framework to compress training data for instance-based methods (such as k-nearest neighbors). We accomplish this by inferring a smaller set of pseudo-inputs in a new feature space learned by a deep neural network. Our framework can equivalently be seen as jointly learning a nonlinear distance metric (induced by the deep feature space) an...
Deep learning - Regularization
where θ̃ is an estimator of θ coming from update equations or the solution of an optimization procedure. Variability in θ̃ is due to randomness in the data, and bias is due to model mismatch. This is the well-known bias-variance trade-off: as the complexity of the model is increased, model mismatch (bias) is decreased while variance in the prediction is increased because of randomness in the training inputs. 4. Deep Learning s...
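The trade-off described in this snippet follows from the standard decomposition of an estimator's expected squared error, stated here for reference (a textbook identity, not quoted from the truncated source):

$$
\mathbb{E}\big[(\tilde\theta - \theta)^2\big]
= \underbrace{\big(\mathbb{E}[\tilde\theta] - \theta\big)^2}_{\text{bias}^2}
+ \underbrace{\mathbb{E}\big[(\tilde\theta - \mathbb{E}[\tilde\theta])^2\big]}_{\text{variance}}
$$

Increasing model complexity typically shrinks the bias term while inflating the variance term, which is the trade-off the snippet refers to.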
Bayesian Distance Metric Learning
This thesis explores the use of Bayesian distance metric learning (Bayes-dml) for the task of speaker verification using the i-vector feature representation. We propose a framework that explores the distance constraints between i-vector pairs from the same speaker and different speakers. With an approximation of the distance metric as a weighted covariance matrix of the top eigenvectors from th...
Journal
Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence
Year: 2021
ISSN: 2159-5399, 2374-3468
DOI: https://doi.org/10.1609/aaai.v35i3.16277